Canonical Correlation Clarified by Singular Value Decomposition
Abstract
You want to find a linear combination of the x coordinates that correlates well over the data with an (in general, different) linear combination of the y coordinates. In fact, you want to find the best such matched pair of linear combinations on the x and y sides, that is, the one yielding the largest coefficient of correlation. But why stop there? Once you have the best pair, you can ask for the second-best pair, that is, the linear combination of x coordinates in the subspace orthogonal to the first combination (on the x side) that best correlates with a linear combination of y coordinates in the subspace orthogonal to the first combination (on the y side). And so on for further pairs of linear combinations in the remaining orthogonal subspaces. We can proceed like this until we have exhausted the orthogonal subspaces on either the x or the y side, whichever comes first. So the number of pairs (termed the number of canonical coordinates) is d = min[rank(x), rank(y)], where rank means column rank. We denote the matrices of coefficients of the linear combinations by a (on the x side), of size p1 × d, and b (on the y side), of size p2 × d. If matrices u and v denote the linear combinations evaluated for each data point, we have u = xa and v = yb.
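The construction above maps directly onto a QR-plus-SVD computation. Below is a minimal NumPy sketch, not taken from the paper, that assumes x (n × p1) and y (n × p2) are data matrices with full column rank; the function name cca_svd is illustrative, while a, b, u, v, and d follow the abstract's notation.

import numpy as np

def cca_svd(x, y):
    # Center each block so correlations are measured about the mean.
    x = x - x.mean(axis=0)
    y = y - y.mean(axis=0)

    # Thin QR factorizations give orthonormal bases for the two column
    # spaces (assumes full column rank, so rx and ry are invertible).
    qx, rx = np.linalg.qr(x)
    qy, ry = np.linalg.qr(y)

    # SVD of the cross-product of the orthonormal bases; the singular
    # values are the canonical correlation coefficients, largest first.
    left, s, right_t = np.linalg.svd(qx.T @ qy)

    d = min(x.shape[1], y.shape[1])

    # Coefficient matrices a (p1 x d) and b (p2 x d) on the x and y sides.
    a = np.linalg.solve(rx, left[:, :d])
    b = np.linalg.solve(ry, right_t.T[:, :d])

    # Canonical variates evaluated at each data point: u = x a, v = y b.
    return a, b, x @ a, y @ b, s[:d]

On such data, np.corrcoef(u[:, 0], v[:, 0])[0, 1] should match the leading canonical correlation s[0], and later columns repeat the pattern within the remaining orthogonal subspaces.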
Related articles
Canonical Correlation Analysis
We discuss algorithms for performing canonical correlation analysis. In canonical correlation analysis we try to find correlations between two data sets. The canonical correlation coefficients can be calculated directly from the two data sets or from (reduced) representations such as the covariance matrices. The algorithms for both representations are based on singular value decomposition. The ...
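Since this summary notes that the canonical correlation coefficients can be computed from the covariance matrices alone, here is a hedged sketch of the standard whitening-plus-SVD route (not code from that paper; the names inv_sqrt and canonical_correlations are illustrative):

import numpy as np

def inv_sqrt(c, eps=1e-12):
    # Symmetric inverse square root via an eigendecomposition; tiny
    # eigenvalues are floored so the routine stays finite.
    w, q = np.linalg.eigh(c)
    return q @ np.diag(1.0 / np.sqrt(np.maximum(w, eps))) @ q.T

def canonical_correlations(cxx, cxy, cyy):
    # Whiten the cross-covariance on both sides; the singular values of
    # the result are the canonical correlations, in decreasing order.
    m = inv_sqrt(cxx) @ cxy @ inv_sqrt(cyy)
    return np.linalg.svd(m, compute_uv=False)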
On the optimal approximation for the symmetric Procrustes problems of the matrix equation AXB = C
The explicit analytical expressions of the optimal approximation solutions for the symmetric Procrustes problems of the linear matrix equation AXB = C are derived, using the projection theorem in Hilbert space, the quotient singular value decomposition (QSVD), and the canonical correlation decomposition (CCD).
Functional Singular Component Analysis
Aiming at quantifying the dependency of pairs of functional data (X, Y), we develop the concept of functional singular value decomposition for covariance and functional singular component analysis, building on the concept of "canonical expansion" of compact operators in functional analysis. We demonstrate the estimation of the resulting singular values, functions and components for the practica...
Large Scale Canonical Correlation Analysis with Iterative Least Squares
Canonical Correlation Analysis (CCA) is a widely used statistical tool with both well established theory and favorable performance for a wide range of machine learning problems. However, computing CCA for huge datasets can be very slow since it involves implementing QR decomposition or singular value decomposition of huge matrices. In this paper we introduce L-CCA, an iterative algorithm which ...
Singular Value Decomposition and the Centrality of Löwdin Orthogonalizations
The different orthogonal relationship that exists in the Löwdin orthogonalizations is presented. Other orthogonalization techniques such as polar decomposition (PD), principal component analysis (PCA) and reduced singular value decomposition (SVD) can be derived from Löwdin methods. It is analytically shown that the polar decomposition is presented in the symmetric orthogonalization; princip...
Publication date: 2011